# Small parameter efficiency
- **Flarenew** (HaveAI) · MIT license · 718 downloads · 1 like
  LaMini-Flan-T5-783M is a lightweight text generation model based on the T5 architecture; it supports English, Russian, and Ukrainian and is suitable for a variety of natural language processing tasks.
  Tags: Large Language Model · Transformers · Multilingual
- **Reasonablellama3 3B Jr** (adeelahmad) · 1,173 downloads · 6 likes
  A reasoning model fine-tuned from LLaMA-3B, with enhanced reasoning capabilities and multilingual support.
  Tags: Large Language Model · Multilingual
- **Open Cabrita3b GGUF** (lucianosb) · Apache-2.0 license · 352 downloads · 6 likes
  Open Cabrita 3B is an open-source large language model optimized for Portuguese, based on the LLaMA architecture and designed to narrow the performance gap between non-English and English models.
  Tags: Large Language Model · Other
- **Norbert3 Xs** (ltg) · Apache-2.0 license · 228 downloads · 4 likes
  NorBERT 3 xs is a BERT model optimized for Norwegian; at 15M parameters it is the smallest member of the new-generation NorBERT series.
  Tags: Large Language Model · Transformers · Other
- **Gpt2 Small Indonesian** (flax-community) · 290 downloads · 5 likes
  An Indonesian generative model pre-trained with a causal language modeling objective on a TPUv3-8 using the Flax framework (a minimal text-generation sketch follows this list).
  Tags: Large Language Model · Other
- **Klue Roberta Small Nli Sts** (ddobokki) · 141 downloads · 4 likes
  A Korean sentence-transformer model based on KLUE-RoBERTa-small, designed for sentence similarity and natural language inference tasks (a sentence-similarity sketch follows this list).
  Tags: Text Embedding · Transformers · Korean
- **Gpt2** (indonesian-nlp) · 130 downloads · 11 likes
  An Indonesian generative model pre-trained with a causal language modeling objective using the Flax framework.
  Tags: Large Language Model · Other
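The small causal language models above can typically be tried through the Hugging Face `transformers` pipeline API. The sketch below assumes the Indonesian GPT-2 checkpoint is published under the Hub id `flax-community/gpt2-small-indonesian` (inferred from the author and model names in the list, not confirmed here); substitute the actual id of whichever model you want to test.

```python
# Minimal sketch: run a small causal LM from the list with the
# Hugging Face `transformers` text-generation pipeline.
from transformers import pipeline

MODEL_ID = "flax-community/gpt2-small-indonesian"  # assumed Hub id, adjust as needed

# The pipeline downloads the tokenizer and weights on first use.
generator = pipeline("text-generation", model=MODEL_ID)

# Generate a short continuation from an Indonesian prompt.
outputs = generator(
    "Sewindu sudah kita tak berjumpa,",
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```

The pipeline wrapper hides tokenizer loading and generation settings; for finer control you could instead load `AutoTokenizer` and `AutoModelForCausalLM` directly and call `generate` yourself.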
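For the Korean text-embedding entry, a sentence-similarity check might look like the following sketch. It assumes the checkpoint is available as `ddobokki/klue-roberta-small-nli-sts` (again inferred from the listing, not verified) and that it loads with the `sentence-transformers` library, as the description suggests.

```python
# Minimal sketch: compare two Korean sentences with a sentence-transformer
# checkpoint such as the KLUE-RoBERTa-small model listed above.
from sentence_transformers import SentenceTransformer, util

MODEL_ID = "ddobokki/klue-roberta-small-nli-sts"  # assumed Hub id, adjust as needed

model = SentenceTransformer(MODEL_ID)

sentences = [
    "오늘 날씨가 정말 좋다.",          # "The weather is really nice today."
    "날씨가 화창해서 기분이 좋다.",    # "It is sunny, so I feel good."
]

# Encode both sentences and score them with cosine similarity.
embeddings = model.encode(sentences, convert_to_tensor=True)
score = util.cos_sim(embeddings[0], embeddings[1])
print(f"cosine similarity: {score.item():.3f}")
```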